6 research outputs found

    Topics in inference and decision-making with partial knowledge

    Two essential elements in the process of inference and decision-making are prior probabilities and likelihood functions. When both components are known accurately and precisely, the Bayesian approach provides a consistent and coherent solution to inference and decision-making problems. In many situations, however, one or both components may not be known, or at least not known precisely. This problem of partial knowledge about prior probabilities and likelihood functions is addressed here. There are at least two ways to cope with this lack of precise knowledge: robust methods and interval-valued methods. First, ways of modeling imprecision and indeterminacy in prior probabilities and likelihood functions are examined; then it is shown how imprecision in these components carries over to the posterior probabilities. Finally, the problem of decision-making with imprecise posterior probabilities and the consequences of such decisions are addressed. The above problems arise in application areas such as statistical pattern recognition, for example the classification of high-dimensional multispectral remote sensing image data.
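    The carryover of prior imprecision to the posterior described above can be pictured with a minimal sketch. For a binary hypothesis with known likelihoods, the posterior is monotone in the prior, so an interval-valued prior maps to an interval-valued posterior by evaluating the endpoints. The function name and the numbers below are illustrative, not taken from the thesis:

```python
def posterior_interval(prior_lo, prior_hi, lik_h, lik_not_h):
    """Bounds on the posterior P(H|x) when the prior P(H) is only known
    to lie in [prior_lo, prior_hi] (binary case, known likelihoods).
    The posterior is monotone increasing in the prior, so the interval
    endpoints give the posterior bounds."""
    def post(p):
        return p * lik_h / (p * lik_h + (1 - p) * lik_not_h)
    return post(prior_lo), post(prior_hi)

# Hypothetical numbers: prior known only to lie in [0.2, 0.4].
lo, hi = posterior_interval(0.2, 0.4, lik_h=0.9, lik_not_h=0.1)
```

    A precise prior would collapse the interval to a single posterior; the width of [lo, hi] measures how much prior imprecision survives conditioning.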

    Fall Detection with Unobtrusive Infrared Array Sensors

    As the world's aging population grows, falls are becoming a major public health problem and one of the most serious risks to the elderly. Many technology-based fall detection systems have been developed in recent years, with hardware ranging from wearable devices to ambient sensors and video cameras. Several machine-learning-based fall detection classifiers have been developed to process sensor data, with varying degrees of success. In this paper, we present a fall detection system that uses infrared array sensors with several deep learning methods, including long short-term memory (LSTM) and gated recurrent unit (GRU) models. Evaluated on fall data collected in two different sets of configurations, our approach gives a significant improvement over existing works using the same infrared array sensor.
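    As a rough illustration of how a recurrent model of this kind consumes a sequence of infrared frames, here is a single scalar GRU cell in pure Python. The weights and frame summaries are made-up placeholders; a real system like the one in the paper would use a learned multi-dimensional model in a deep learning framework:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, w):
    """One GRU cell update for a scalar input x (e.g. the mean
    temperature of one infrared array frame) and scalar hidden state h.
    w holds the six scalar weights; biases are omitted for brevity."""
    z = sigmoid(w["wz"] * x + w["uz"] * h)              # update gate
    r = sigmoid(w["wr"] * x + w["ur"] * h)              # reset gate
    h_tilde = math.tanh(w["wh"] * x + w["uh"] * (r * h))  # candidate state
    return (1.0 - z) * h + z * h_tilde

# Run a short sequence of hypothetical normalized frame summaries.
weights = {"wz": 0.5, "uz": 0.3, "wr": 0.4, "ur": 0.2, "wh": 0.9, "uh": 0.1}
h = 0.0
for frame_mean in [0.1, 0.3, 0.9, 0.8]:
    h = gru_step(frame_mean, h, weights)
```

    A fall detector would feed the final (or pooled) hidden state into a classifier layer; the gating is what lets the model retain context across the frames of a fall event.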

    INFERENCE AND DECISION-MAKING WITH PARTIAL KNOWLEDGE

    Bayesian inference and decision-making require elicitation of prior probabilities and sampling distributions. In many applications, such as exploratory data analysis, however, it may not be possible to construct the prior probabilities or the sampling distributions precisely. The objective of this thesis is to address the issues and provide some solutions to the problem of inference and decision-making with imprecise or partially known priors and sampling distributions. More specifically, we address the following three interrelated problems: (1) how to describe imprecise priors and sampling distributions, (2) how to proceed from approximate priors and sampling distributions to approximate posteriors and posterior-related quantities, and (3) how to make decisions with imprecise posterior probabilities. When the priors and/or sampling distributions are not known precisely, a natural approach is to consider a class or a neighborhood of priors, and classes or collections of sampling distributions. This approach leads naturally to the consideration of upper and lower probabilities, or interval-valued probabilities. We examine the various approaches to the representation of imprecision in priors and sampling distributions. We observe that many useful classes, either for the priors or for the sampling distributions, are conveniently described in terms of 2-Choquet capacities. We prove Bayes' theorem (or conditioning) for the 2-Choquet capacity classes. Since the classes of imprecise probabilities described by the Dempster-Shafer theory are ∞-Choquet capacities (and therefore 2-Choquet capacities), our result provides another proof of the inconsistency of Dempster's rule. We address the problem of combining various sources of information and the requirements for a reasonable combination rule. Here, we also examine the independence of sources of information, which is a crucial issue in combining them.
We consider three methods for combining imprecise information. The first method utilizes the extreme-point representations of the imprecise priors and/or sampling distributions to obtain the extreme points of the class of posteriors. This method is usually computationally very demanding. Therefore, we propose a simple iterative procedure that allows direct computation not only of the posterior probabilities but also of many useful posterior-related quantities, such as the posterior mean, the predictive probability that the next observation lies in a given set, and the posterior expected loss of a decision or action. Finally, by considering the joint space of observations and parameters, we show that if the class of joint probabilities is a 2-Choquet capacity class, we can apply the Bayes' theorem established earlier to obtain the posterior probabilities. This last approach is computationally the most efficient. Finally, we address the problem of decision-making with imprecise posteriors obtained from imprecise priors and sampling distributions. Even though allowing imprecision is a natural way to represent lack of information, it sometimes leads to complications in decision-making and even to indeterminacies. We suggest a few ad hoc rules to resolve the remaining indeterminacies. The ultimate solution in such cases is simply to gather more data.
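The defining 2-monotonicity property of a 2-Choquet capacity, v(A∪B) + v(A∩B) ≥ v(A) + v(B), can be verified directly on a small finite space. The sketch below uses the standard ε-contamination neighborhood as an example of such a class; the base distribution and ε value are illustrative, not from the thesis:

```python
from itertools import chain, combinations

def powerset(omega):
    """All subsets of a finite sample space, as frozensets."""
    return [frozenset(s) for s in chain.from_iterable(
        combinations(omega, r) for r in range(len(omega) + 1))]

def lower_prob(A, base, eps, omega):
    """Lower envelope of the eps-contamination class
    {(1 - eps) * P0 + eps * Q : Q any distribution}: every probability
    in the class assigns at least (1 - eps) * P0(A) to A != Omega."""
    if A == frozenset(omega):
        return 1.0
    return (1 - eps) * sum(base[x] for x in A)

def is_two_monotone(v, omega):
    """Check v(A|B) + v(A&B) >= v(A) + v(B) for all subset pairs."""
    subsets = powerset(omega)
    return all(v(A | B) + v(A & B) >= v(A) + v(B) - 1e-12
               for A in subsets for B in subsets)

omega = ("a", "b", "c")
base = {"a": 0.5, "b": 0.3, "c": 0.2}
v = lambda A: lower_prob(A, base, 0.1, omega)
```

Here is_two_monotone(v, omega) returns True: the ε-contamination lower envelope is 2-monotone, so it falls within the class for which the Bayes' theorem above applies.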

    A Survey of Decision Tree Classifier Methodology

    Decision tree classifiers (DTCs) are used successfully in many diverse areas such as radar signal classification, character recognition, remote sensing, medical diagnosis, expert systems, and speech recognition. Perhaps the most important feature of DTCs is their ability to break down a complex decision-making process into a collection of simpler decisions, thus providing a solution that is often easier to interpret. A survey of current methods for DTC design and the various outstanding issues is presented. After considering the potential advantages of DTCs over single-stage classifiers, the subjects of tree structure design, feature selection at each internal node, and decision and search strategies are discussed.
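    The feature-selection step at an internal node that the survey discusses can be sketched as an exhaustive search for the split with maximum information gain. This is one common criterion among the several the survey covers; the toy dataset and names below are illustrative:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label multiset, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(samples, labels, n_features):
    """Pick the (feature, threshold) pair at an internal node that
    maximizes information gain over all axis-parallel splits."""
    best = (None, None, -1.0)
    base = entropy(labels)
    for f in range(n_features):
        for t in sorted({s[f] for s in samples}):
            left = [y for s, y in zip(samples, labels) if s[f] <= t]
            right = [y for s, y in zip(samples, labels) if s[f] > t]
            if not left or not right:
                continue  # degenerate split, skip
            gain = base - (len(left) * entropy(left)
                           + len(right) * entropy(right)) / len(labels)
            if gain > best[2]:
                best = (f, t, gain)
    return best

# Toy 2-feature dataset, perfectly separable on feature 0.
X = [(1.0, 5.0), (2.0, 4.0), (8.0, 1.0), (9.0, 2.0)]
y = ["low", "low", "high", "high"]
feature, threshold, gain = best_split(X, y, 2)
```

    Applying this greedily at each node, stopping on pure or small nodes, yields the kind of top-down tree construction the survey analyzes.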

    Learning patterns of states in time series by genetic programming

    A state in a time series can be described as a certain signal pattern that occurs consistently over a long period of time. Learning such a pattern can be useful for automatic identification of the time series state in tasks like activity recognition. In this study we showcase the capability of our GP-based time series analysis method in learning different types of states from multi-channel stream input. This evolutionary learning method can handle relatively complex scenarios using only raw inputs, requiring no hand-crafted features. The method performed very well on both artificial time series and real-world human activity data, and is competitive with classical learning methods that rely on features.
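    One way to picture the evolutionary loop in such a method is a fitness function that scores a candidate program on labeled windows of the multi-channel stream. The candidate expression and data below are hypothetical stand-ins, not the authors' evolved programs:

```python
def candidate(window):
    """A stand-in for one evolved GP expression: maps a two-channel
    window of raw samples to a real-valued score."""
    ch0, ch1 = window
    return sum(ch0) / len(ch0) - max(ch1)

def fitness(program, windows, labels, threshold=0.0):
    """Fraction of windows whose thresholded program output matches the
    annotated state label -- the selection signal driving evolution."""
    preds = [program(w) > threshold for w in windows]
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

# Toy two-channel windows annotated with a binary "active" state.
windows = [([1.0, 1.0, 1.0], [0.0, 0.0, 0.0]),
           ([0.0, 0.0, 0.0], [1.0, 1.0, 1.0])]
labels = [True, False]
score = fitness(candidate, windows, labels)
```

    Genetic programming would mutate and recombine a population of such expressions, keeping the high-fitness ones, so no manual feature engineering is needed on the raw channels.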